321 research outputs found

    Comparative study of central decision makers versus groups of evolved agents trading in equity markets

    This paper investigates the process of deriving a single decision based solely on the decisions made by a population of experts. Four different amalgamation processes, collectively referred to as central decision makers, are studied and compared with one another. The expert (or reference) population is trained on historical equity market data to make trading decisions, using a simple genetic algorithm with crossover, elitism and immigration. The performance of the trained agent population's elite, as determined by testing on an out-of-sample data set, is then compared with that of the central decision makers to determine which performs better. Performance was measured as the area under the total-assets curve over the out-of-sample testing period, to avoid biasing results towards the cut-off date, as the more traditional measure of profit would. Results showed that none of the implemented methods of deriving a centralized decision outperformed the evolved and optimized agent population. Further, no difference in performance was found between the four central decision makers.
    Keywords: Agents, Decision Making, Equity Market Trading, Genetic Algorithms, Technical Indicators
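The amalgamation idea can be sketched as a simple majority vote over the experts' trading signals. The abstract does not detail the four schemes studied, so the "buy"/"sell"/"hold" signal set and the conservative tie-breaking rule below are illustrative assumptions only:

```python
from collections import Counter

def amalgamate(decisions):
    """Derive one central trading decision from a population of expert
    decisions by majority vote -- one possible amalgamation scheme,
    not necessarily any of the four compared in the paper."""
    top = Counter(decisions).most_common()
    # Assumption: ties are broken by preferring "hold" as the
    # conservative default.
    if len(top) > 1 and top[0][1] == top[1][1]:
        return "hold"
    return top[0][0]
```

For example, `amalgamate(["buy", "buy", "sell"])` yields `"buy"`, while an evenly split population falls back to `"hold"`.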

    Integrative analysis of large-scale biological data sets

    We present two novel web applications for microarray and gene/protein set analysis, ArrayMining.net and TopoGSA. These bioinformatics tools use integrative analysis methods, including ensemble and consensus machine learning techniques, as well as modular combinations of different analysis types, to extract new biological insights from experimental transcriptomics and proteomics data. They enable researchers to combine related algorithms and datasets to increase the robustness and accuracy of statistical analyses, and to exploit synergies of different computational methods, ranging from statistical learning to optimization and topological network analysis.
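The ensemble/consensus idea can be illustrated by combining per-sample class predictions from several models via majority vote. This is a minimal sketch of the general technique, not the actual implementation behind ArrayMining.net or TopoGSA:

```python
from collections import Counter

def consensus_predict(predictions_per_model):
    """Combine class predictions from several models into a consensus
    label per sample by majority vote. `predictions_per_model` is a
    list of equal-length prediction lists, one per model."""
    n_samples = len(predictions_per_model[0])
    consensus = []
    for i in range(n_samples):
        votes = Counter(model[i] for model in predictions_per_model)
        consensus.append(votes.most_common(1)[0][0])
    return consensus
```

Combining several weak learners this way typically reduces the variance of any single model's predictions, which is the robustness argument made in the abstract.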

    An Idiotypic Immune Network as a Short Term Learning Architecture for Mobile Robots

    A combined Short-Term Learning (STL) and Long-Term Learning (LTL) approach to solving mobile robot navigation problems is presented and tested in both real and simulated environments. The LTL consists of rapid simulations that use a Genetic Algorithm to derive diverse sets of behaviours. These sets are then transferred to an idiotypic Artificial Immune System (AIS), which forms the STL phase, and the system is said to be seeded. The combined LTL-STL approach is compared with using STL only, and with using a hand-designed controller. In addition, the STL phase is tested with the idiotypic mechanism turned off. The results provide substantial evidence that the best option is the seeded idiotypic system, i.e. the architecture that merges LTL with an idiotypic AIS for the STL. They also show that structurally different environments can be used for the two phases without compromising transferability.
    Comment: 13 pages, 5 tables, 4 figures, 7th International Conference on Artificial Immune Systems (ICARIS 2008), Phuket, Thailand
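The LTL-to-STL hand-off can be sketched as selecting fit, mutually distinct behaviour sets from the evolved population to seed the short-term learner. The greedy duplicate-skipping rule and the vector encoding below are illustrative assumptions, not the paper's exact seeding method:

```python
def seed_stl(population, fitness, n_seeds=6):
    """Pick up to `n_seeds` distinct, high-fitness behaviour sets from
    a GA-evolved population to seed the short-term learner (the
    idiotypic AIS in the paper). Behaviour sets are encoded here as
    numeric tuples, an assumed representation."""
    ranked = sorted(population, key=fitness, reverse=True)

    def dist(a, b):
        # Squared Euclidean distance between two behaviour encodings.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    seeds = [ranked[0]]
    for cand in ranked[1:]:
        if len(seeds) >= n_seeds:
            break
        if all(dist(cand, s) > 0 for s in seeds):  # skip exact duplicates
            seeds.append(cand)
    return seeds
```

The point of keeping the seed set diverse is the one made in the abstract: the STL phase starts from a spread of behaviours rather than many copies of a single elite individual.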

    vrmlgen: An R Package for 3D Data Visualization on the Web

    The 3-dimensional representation and inspection of complex data is a frequently used strategy in many data analysis domains. Existing data mining software often lacks functionality that would enable users to explore 3D data interactively, especially if one wishes to make dynamic graphical representations directly viewable on the web. In this paper we present vrmlgen, a software package for the statistical programming language R to create 3D data visualizations in web formats like the Virtual Reality Modeling Language (VRML) and LiveGraphics3D. vrmlgen can be used to generate 3D charts and bar plots, scatter plots with density estimation contour surfaces, and visualizations of height maps, 3D object models and parametric functions. For greater flexibility, the user can also access low-level plotting methods through a unified interface and freely group different function calls together to create new higher-level plotting methods. Additionally, we present a web tool allowing users to visualize 3D data online and test some of vrmlgen's features without the need to install any software on their computer.
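The VRML output format that vrmlgen targets can be illustrated with a few lines that place a small sphere at each data point. vrmlgen itself is an R package; this Python fragment only sketches the VRML97 file format, and the function name, file name and sphere radius are arbitrary choices:

```python
def write_vrml_scatter(points, path="scatter.wrl"):
    """Write a minimal VRML97 scene containing one sphere per 3D point,
    illustrating the kind of web-viewable output vrmlgen produces."""
    header = "#VRML V2.0 utf8\n"
    nodes = []
    for x, y, z in points:
        nodes.append(
            "Transform { translation %g %g %g "
            "children [ Shape { geometry Sphere { radius 0.05 } } ] }"
            % (x, y, z)
        )
    with open(path, "w") as f:
        f.write(header + "\n".join(nodes) + "\n")
    return path
```

The resulting `.wrl` file can be opened in any VRML-capable viewer or browser plug-in, which is what makes the format suitable for the web-based workflow the abstract describes.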

    INTELLIGENT TECHNIQUES FOR HANDLING UNCERTAINTY IN THE ASSESSMENT OF NEONATAL OUTCOME

    Objective assessment of the neonatal outcome of labour is important, but it is a difficult and challenging problem. It is an invaluable source of information which can be used to provide feedback to clinicians, to audit a unit's overall performance, and can guide subsequent neonatal care. Current methods are inadequate as they fail to distinguish damage that occurred during labour from damage that occurred before or after labour. Analysis of the chemical acid-base status of blood taken from the umbilical cord of an infant immediately after delivery provides information on any damage suffered by the infant due to lack of oxygen during labour. However, this process is complex and error-prone, and requires expertise which is not always available on labour wards. A model of the clinical expertise required for the accurate interpretation of umbilical acid-base status was developed, and encapsulated in a rule-based expert system. This expert system checks results to ensure their consistency, identifies whether the results come from arterial or venous vessels, and then produces an interpretation of their meaning. This 'crisp' expert system was validated, verified and commercially released, and has since been installed at twenty-two hospitals around the United Kingdom. The assessment of umbilical acid-base status is characterised by uncertainty in both the basic data and the knowledge required for its interpretation. Fuzzy logic provides a technique for representing both these forms of uncertainty in a single framework. A 'preliminary' fuzzy-logic-based expert system to interpret error-free results was developed, based on the knowledge embedded in the crisp expert system. Its performance was compared against clinicians in a validation test, but initially it was found to perform poorly in comparison with the clinicians, and to be inferior to the crisp expert system.
    An automatic tuning algorithm was developed to modify the behaviour of the fuzzy model utilised in the expert system. Sub-normal membership functions were used to weight terms in the fuzzy expert system in a novel manner. This improved the performance of the fuzzy expert system to a level comparable to the clinicians, and superior to the crisp expert system. Experimental work was carried out to evaluate the imprecision in umbilical cord acid-base parameters. This information, in conjunction with fresh knowledge elicitation sessions, allowed the creation of a more comprehensive fuzzy expert system to validate and interpret all acid-base data. This 'integrated' fuzzy expert system was tuned using the comparison data obtained previously, and incorporated vessel identification rules and interpretation rules, with numeric and linguistic outputs for each. The performance of each of the outputs was evaluated in a rigorous validation study. This demonstrated excellent agreement with the experts for the numeric outputs, and agreement on a par with the experts for the linguistic outputs. The numeric interpretation produced by the fuzzy expert system is a novel one-dimensional measure that accurately represents the severity of acid-base results. The development of the crisp and fuzzy expert systems represents a major achievement and constitutes a significant contribution to the assessment of neonatal outcome.
    Plymouth Postgraduate Medical School
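The sub-normal weighting idea can be sketched with a triangular membership function whose peak is scaled below 1.0 by a term weight, so that less reliable terms contribute less to the inference. The membership shapes, parameter values and weights below are illustrative assumptions, not those of the thesis:

```python
def triangular(x, a, b, c):
    """Standard triangular membership function: rises from a to a peak
    of 1.0 at b, then falls back to 0.0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def weighted_membership(x, a, b, c, weight):
    """Sub-normal membership: the peak is scaled to `weight` < 1.0,
    which is one way to weight a term's influence in a fuzzy rule
    base, as the thesis does in its own (different) formulation."""
    return weight * triangular(x, a, b, c)
```

An automatic tuner of the kind described above would then adjust `weight` (and possibly `a`, `b`, `c`) to minimise disagreement with expert interpretations on the comparison data.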

    A novel framework to elucidate core classes in a dataset

    In this paper we present an original framework to extract representative groups from a dataset, and we validate it over a novel case study. The framework specifies the application of different clustering algorithms, after which several statistical and visualisation techniques are used to characterise the results, and core classes are defined by consensus clustering. Classes may be verified using supervised classification algorithms to obtain a set of rules which may be useful for new data points in the future. This framework is validated over a novel set of histone markers for breast cancer patients. From a technical perspective, the resultant classes are well separated and characterised by low, medium and high levels of biological markers. Clinically, the groups appear to distinguish patients with poor overall survival from those with low grading score and better survival. Overall, this framework offers a promising methodology for elucidating core consensus groups from data.
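The consensus-clustering step can be sketched with a co-association matrix: for each pair of samples, the fraction of clusterings that place them in the same cluster. Thresholding such a matrix is one common way to define core classes; this is a sketch of the general technique, not the paper's exact pipeline:

```python
import itertools

def coassociation(labelings, n):
    """Build an n x n co-association matrix from several clustering
    results. Entry [i][j] is the fraction of clusterings that assign
    samples i and j to the same cluster."""
    m = [[0.0] * n for _ in range(n)]
    for labels in labelings:
        for i, j in itertools.combinations(range(n), 2):
            if labels[i] == labels[j]:
                m[i][j] += 1
                m[j][i] += 1
    k = len(labelings)
    for i in range(n):
        m[i][i] = k  # a sample always co-occurs with itself
    return [[v / k for v in row] for row in m]
```

Pairs with co-association close to 1.0 across many algorithms form the well-separated "core" groups the abstract refers to; pairs near 0.0 are confidently separated.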

    Mimicking the Behaviour of Idiotypic AIS Robot Controllers Using Probabilistic Systems

    Previous work has shown that robot navigation systems that employ an architecture based upon the idiotypic network theory of the immune system have an advantage over control techniques that rely on reinforcement learning only. This is thought to be a result of intelligent behaviour selection on the part of the idiotypic robot. In this paper an attempt is made to imitate idiotypic dynamics by creating controllers that use reinforcement with a number of different probabilistic schemes to select robot behaviour. The aims are to show that the idiotypic system is not merely performing some kind of periodic random behaviour selection, and to gain further insight into the processes that govern the idiotypic mechanism. Trials are carried out using simulated Pioneer robots that undertake navigation exercises. Results show that a scheme that boosts the probability of selecting highly ranked alternative behaviours to 50% during stall conditions comes closest to achieving the properties of the idiotypic system, but remains unable to match it in terms of all-round performance.
    Comment: 7 pages, 2 figures, 6 tables, 13th World Multi-Conference on Systemics, Cybernetics and Informatics: WMSCI 2009, Orlando, Florida, USA
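The best-performing scheme can be sketched as follows: the top-ranked behaviour is normally selected, but during a stall the probability of switching to a highly ranked alternative is boosted to 50%. Only the 50% boost figure comes from the abstract; the rule for choosing among the alternatives below is an assumption:

```python
import random

def select_behaviour(ranked, stalled, boost=0.5, rng=random):
    """Pick a behaviour from a reinforcement-ranked list (best first).
    Normally the top-ranked behaviour is chosen; during a stall the
    probability of jumping to a highly ranked alternative is `boost`.
    Assumption: the alternative is drawn uniformly from the next two
    best behaviours."""
    if stalled and len(ranked) > 1 and rng.random() < boost:
        alternatives = ranked[1:3] if len(ranked) > 2 else ranked[1:]
        return rng.choice(alternatives)
    return ranked[0]
```

Passing `rng` explicitly keeps the selection reproducible in simulation runs, which matters when comparing probabilistic schemes against the deterministic parts of an idiotypic controller.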